# Lightweight fine-tuning
## QVikhr-3-1.7B-Instruction-noreasoning
License: Apache-2.0
QVikhr-3-1.7B-Instruction-noreasoning is an instruction-tuned model based on Qwen/Qwen3-1.7B, trained on the Russian GrandMaster2 dataset and designed for efficient processing of Russian and English text (loading sketch below).
Tags: Large Language Model · Transformers
Author: Vikhrmodels · Downloads: 274 · Likes: 10
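A minimal generation sketch using the Transformers chat template; the repo id is an assumption constructed from the listing, not verified:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vikhrmodels/QVikhr-3-1.7B-Instruction-noreasoning"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Russian instruction; the model also handles English prompts
messages = [{"role": "user", "content": "Объясни, что такое дообучение модели."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```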
## mT5-Small-Finetuned-Gazeta-Ru
License: Apache-2.0
A Russian abstractive summarization model fine-tuned from google/mt5-small on the Gazeta news dataset (usage sketch below).
Tags: Text Generation · TensorBoard
Author: sansmislom · Downloads: 33 · Likes: 0
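A summarization sketch with the standard Seq2Seq API; the repo id is assumed from the listing:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "sansmislom/mt5-small-finetuned-gazeta-ru"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Текст новостной статьи для сжатия..."  # a Russian news article
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```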
## Mistral-Portuguese-Luana-7b-Chat
License: Apache-2.0
A fine-tuned Mistral 7B model trained on 250,000 Portuguese chat examples and optimized for Portuguese conversational use (chat sketch below).
Tags: Large Language Model · Transformers
Author: rhaymison · Downloads: 391 · Likes: 5
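A chat sketch via the Transformers pipeline; the repo id is assumed from the listing:

```python
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="rhaymison/Mistral-portuguese-luana-7b-chat",  # assumed repo id
    device_map="auto",
)
messages = [{"role": "user", "content": "Explique o que é ajuste fino de um modelo."}]
result = chat(messages, max_new_tokens=150)
print(result[0]["generated_text"][-1]["content"])  # the assistant's reply
```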
## Mamba-370m-hf
Mamba is an efficient language model built on the state space model (SSM) architecture; it models sequences with linear time complexity (loading sketch below).
Tags: Large Language Model · Transformers
Author: state-spaces · Downloads: 6,895 · Likes: 14
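The -hf suffix indicates a Transformers-format checkpoint, so it loads through the standard causal-LM classes (Mamba support landed in transformers 4.39); a minimal sketch:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "state-spaces/mamba-370m-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The SSM scan keeps generation cost linear in sequence length
inputs = tokenizer("State space models process sequences", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```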
## Tiny-Vicuna-1B-GGUF
Tiny-Vicuna-1B is a lightweight model fine-tuned from TinyLlama 1.1B on the WizardVicuna dataset, intended for early-stage experimental iteration; this repo provides GGUF quantizations (llama.cpp sketch below).
Tags: Large Language Model
Author: afrideva · Downloads: 208.74k · Likes: 6
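GGUF files target llama.cpp rather than Transformers; a sketch with llama-cpp-python, where the quantization filename is an assumption (check the repo's file list):

```python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="afrideva/Tiny-Vicuna-1B-GGUF",
    filename="tiny-vicuna-1b.q4_k_m.gguf",  # assumed quant file; see repo files
    n_ctx=2048,
)
out = llm("USER: What is a language model?\nASSISTANT:", max_tokens=100)
print(out["choices"][0]["text"])
```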
## Tiny-Vicuna-1B
License: Apache-2.0
Tiny-Vicuna-1B is TinyLlama fine-tuned on the WizardVicuna dataset; it is compatible with the Vicuna-v1.5 series and suited to early experimental iteration (prompt sketch below).
Tags: Large Language Model · Transformers · English
Author: Jiayi-Pan · Downloads: 1,247 · Likes: 15
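Given the Vicuna-v1.5 compatibility note, a plain Transformers sketch with a Vicuna-style prompt (the exact template is an assumption based on that note):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Jiayi-Pan/Tiny-Vicuna-1B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Vicuna-v1.5-style prompt, per the compatibility note in the card
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "USER: Name one use for a 1B-parameter model. ASSISTANT:"
)
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```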
## Mistral-7b-Guanaco
License: Apache-2.0
A pre-trained language model based on the Llama 2 architecture, suitable for English text generation tasks.
Tags: Large Language Model · Transformers · English
Author: kingabzpro · Downloads: 67 · Likes: 3
## KoreanLM-3B
KoreanLM is an open-source project dedicated to developing Korean language models, aiming to address the scarcity of Korean training resources and inefficient Korean tokenization.
Tags: Large Language Model · Transformers · Multilingual
Author: quantumaikr · Downloads: 33 · Likes: 2
## KoreanLM
KoreanLM is an open-source language model project optimized specifically for Korean; it is designed around the characteristics of Korean grammar and vocabulary and provides an efficient tokenization scheme.
Tags: Large Language Model · Transformers · Multilingual
Author: quantumaikr · Downloads: 59 · Likes: 28
## T5-Spanish-Efficient-Tiny
License: Apache-2.0
An efficient tiny T5 model optimized for Spanish. It is small (<29 MB) and suitable for CPU use, but requires fine-tuning before it produces useful output (see the sketch below).
Tags: Large Language Model · Transformers · Spanish
Author: jalbarracin · Downloads: 269 · Likes: 4
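Since the card says the checkpoint must be fine-tuned before use, a sketch that only loads it and confirms its size; the repo id is assumed from the listing:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "jalbarracin/T5-spanish-efficient-tiny"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Small enough (<29 MB) to fine-tune on CPU; raw outputs are not useful
# until the model has been trained on a downstream Spanish task.
print(f"{model.num_parameters():,} parameters")
```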
## CodeT5-Small-Generate-Docstrings-For-Python-Condensed
License: Apache-2.0
A model fine-tuned from Salesforce/codet5-small to generate docstrings for Python functions (generation sketch below).
Tags: Text Generation · Transformers · English
Author: DunnBC22 · Downloads: 20 · Likes: 4
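A docstring-generation sketch; the repo id is an assumption pieced together from the card title (check DunnBC22's profile for the exact name):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# assumed repo id, constructed from the card title
model_id = "DunnBC22/codet5-small-Generate_Docstrings_for_Python-Condensed"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True)
out = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```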
## Tiny-Random-T5ForConditionalGeneration-Calibrated
An optimized, calibrated miniature T5 model for text generation and transformation tasks; lightweight and efficient.
Tags: Large Language Model · Transformers
Author: ybelkada · Downloads: 581.45k · Likes: 1
## GPT-Neo-2.7B-8bit
License: MIT
A modified, 8-bit version of EleutherAI's GPT-Neo (2.7B parameters) that supports text generation and fine-tuning on Colab or equivalent desktop GPUs (loading sketch below).
Tags: Large Language Model · Transformers · English
Author: gustavecortal · Downloads: 99 · Likes: 11
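The repo ships pre-quantized weights with its own loading recipe, so rather than guess at that code, here is the equivalent modern route: loading the base EleutherAI checkpoint in 8-bit with bitsandbytes, which also fits a Colab-class GPU:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "EleutherAI/gpt-neo-2.7B"  # base checkpoint, quantized on the fly
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",  # ~3 GB of weights in 8-bit, Colab-friendly
)
inputs = tokenizer("Fine-tuning on a single GPU", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=40)[0]))
```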